Minerva-1B-base-v1.0 is a 1-billion-parameter Italian-English bilingual large language model, jointly developed by Sapienza NLP with FAIR and CINECA, and pretrained on 200 billion tokens (100 billion Italian and 100 billion English).
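
As a minimal quick-start sketch, assuming the model is published on the Hugging Face Hub under the ID `sapienzanlp/Minerva-1B-base-v1.0` and loads through the standard Transformers causal-LM API, the base model can be loaded and prompted as follows. Note that a base model is not instruction-tuned, so the prompt is simply text to be continued.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub ID for the model described above.
model_id = "sapienzanlp/Minerva-1B-base-v1.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

# Plain causal-LM continuation of an Italian prompt.
prompt = "La capitale dell'Italia è"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```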